Data Ingestion from Salesforce to Snowflake using Amazon AppFlow

Amazon AppFlow is a fully managed data integration service that lets you ingest data from SaaS applications such as Salesforce or ServiceNow into a Snowflake data lake.
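
Although the Lazsa Platform drives this integration through its UI, it can help to see what an equivalent AppFlow flow definition looks like. The following is a minimal sketch using the AWS SDK for Python (boto3); the flow name, connector profile names, Salesforce object, target table, and staging bucket are hypothetical placeholders, not values taken from this procedure.

```python
import boto3

# AppFlow client; the region is an assumption for this sketch.
appflow = boto3.client("appflow", region_name="us-east-1")

# Define an on-demand flow that copies the Salesforce Account object
# into a Snowflake table. All names below are hypothetical.
appflow.create_flow(
    flowName="salesforce-accounts-to-snowflake",
    triggerConfig={"triggerType": "OnDemand"},
    sourceFlowConfig={
        "connectorType": "Salesforce",
        "connectorProfileName": "salesforce-src",  # existing Salesforce connection
        "sourceConnectorProperties": {
            "Salesforce": {"object": "Account"}
        },
    },
    destinationFlowConfigList=[
        {
            "connectorType": "Snowflake",
            "connectorProfileName": "snowflake-dest",  # existing Snowflake connection
            "destinationConnectorProperties": {
                "Snowflake": {
                    # Fully qualified target table in the Snowflake data lake.
                    "object": "RAW.PUBLIC.ACCOUNT",
                    # AppFlow stages Snowflake loads through an S3 bucket.
                    "intermediateBucketName": "my-appflow-staging",
                }
            },
        }
    ],
    # Map_all copies every source field to the destination unchanged.
    tasks=[{"sourceFields": [], "taskType": "Map_all", "taskProperties": {}}],
)
```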

Prerequisites

To use Amazon AppFlow for data integration, you must complete the following prerequisites:

  • Configure an Amazon AppFlow instance in the Data Integration section of Cloud Platform Tools & Technologies.

  • Configure Salesforce as a data source in the Databases and Data Warehouses section of Cloud Platform Tools & Technologies.

  • Configure Snowflake as a data lake in the Databases and Data Warehouses section of Cloud Platform Tools & Technologies.

  • Create an AppFlow to Snowflake connector in the Databases and Data Warehouses section of Cloud Platform Tools & Technologies. This connector is used to connect to the Snowflake data lake.
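
If you need to script the Snowflake connector rather than create it through the UI, AppFlow exposes a connector profile API. A minimal boto3 sketch follows; the warehouse, stage, bucket, account, and credential values are placeholders you would replace with your own.

```python
import boto3

appflow = boto3.client("appflow", region_name="us-east-1")

# Register a Snowflake connector profile that AppFlow flows can reference
# by name. All identifiers and credentials below are placeholders.
appflow.create_connector_profile(
    connectorProfileName="snowflake-dest",
    connectorType="Snowflake",
    connectionMode="Public",
    connectorProfileConfig={
        "connectorProfileProperties": {
            "Snowflake": {
                "warehouse": "MY_WAREHOUSE",
                # External stage pointing at the S3 staging bucket.
                "stage": "RAW.PUBLIC.MY_APPFLOW_STAGE",
                "bucketName": "my-appflow-staging",
                "accountName": "myaccount",
                "region": "us-east-1",
            }
        },
        "connectorProfileCredentials": {
            "Snowflake": {
                "username": "APPFLOW_USER",
                "password": "********",  # placeholder; store real credentials securely
            }
        },
    },
)
```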

To create a data integration job using Amazon AppFlow

  1. Click the Amazon AppFlow node in the data integration stage of the pipeline, and click Create Job.

  2. Complete the job configuration steps, defining the source, destination, and field mappings for the flow.

  3. This completes the job creation. You can run the job in either of the following ways (for a programmatic equivalent, see the sketch after these steps):

    • Click the data integration node and click Start to initiate the job run.

    • Publish the pipeline and then click Run Pipeline.

  4. After the job run succeeds, you can view the ingested data in the Lazsa Platform. Click the data lake node and browse to the target table. File Preview shows the data ingested into the Snowflake data lake. You can also query the target table directly, as sketched below.
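
Both run options trigger the same underlying flow execution. If you prefer to start and monitor the run outside the UI, the following boto3 sketch shows one way to do it; the flow name is the hypothetical one used earlier.

```python
import time

import boto3

appflow = boto3.client("appflow", region_name="us-east-1")
flow_name = "salesforce-accounts-to-snowflake"  # hypothetical flow name

# Trigger an on-demand run of the flow.
run = appflow.start_flow(flowName=flow_name)
execution_id = run["executionId"]

# Poll the recent execution records until this run finishes.
while True:
    records = appflow.describe_flow_execution_records(flowName=flow_name)
    status = next(
        (r["executionStatus"] for r in records["flowExecutions"]
         if r["executionId"] == execution_id),
        "InProgress",  # the run may not appear in the records immediately
    )
    if status in ("Successful", "Error", "Canceled"):
        print(f"Run {execution_id} finished with status: {status}")
        break
    time.sleep(30)
```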

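Besides File Preview, you can confirm the load by querying the target table in Snowflake. Here is a minimal sketch using the snowflake-connector-python package; the connection parameters and table name are placeholders matching the earlier examples.

```python
import snowflake.connector

# Connection parameters are placeholders; substitute your own account values.
conn = snowflake.connector.connect(
    account="myaccount",
    user="APPFLOW_USER",
    password="********",
    warehouse="MY_WAREHOUSE",
    database="RAW",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Spot-check the row count and a sample of the ingested records.
    cur.execute("SELECT COUNT(*) FROM ACCOUNT")
    print("Rows ingested:", cur.fetchone()[0])

    cur.execute("SELECT * FROM ACCOUNT LIMIT 10")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```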

What's next? Data Integration using Databricks